A Three-Term Gradient Descent Method with Subspace Techniques
Authors
Abstract
Similar Resources
A Three-Term Conjugate Gradient Method with Sufficient Descent Property for Unconstrained Optimization
Conjugate gradient methods are widely used for solving large-scale unconstrained optimization problems, because they do not need the storage of matrices. In this paper, we propose a general form of three-term conjugate gradient methods which always generate a sufficient descent direction. We give a sufficient condition for the global convergence of the proposed general method. Moreover, we pres...
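The direction update behind such methods can be sketched as follows. This is a minimal illustrative template with a Hestenes–Stiefel-style coefficient and a simple backtracking line search, not the specific method proposed in the paper; the function names and parameter values are assumptions.

```python
import numpy as np

def three_term_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Generic three-term conjugate gradient sketch (illustrative).

    Direction update (one common template, not the paper's exact rule):
        d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,  y_k = g_{k+1} - g_k
    With the coefficient choice below this keeps
        d_{k+1}^T g_{k+1} = -||g_{k+1}||^2   (sufficient descent).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (d is always a descent direction here)
        alpha = 1.0
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        denom = d @ y
        if abs(denom) > 1e-12:
            beta = (g_new @ y) / denom    # Hestenes-Stiefel style coefficient
            theta = (g_new @ d) / denom   # third-term coefficient restores descent
            d = -g_new + beta * d - theta * y
        else:
            d = -g_new                    # restart when curvature information vanishes
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
x_star = three_term_cg(f, lambda x: A @ x - b, np.zeros(2))
```

One can verify algebraically that the chosen beta and theta cancel so that the new direction satisfies the sufficient descent identity exactly, independent of the line search, which is the property the abstract emphasizes.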
Stochastic Proximal Gradient Descent with Acceleration Techniques
Proximal gradient descent (PGD) and stochastic proximal gradient descent (SPGD) are popular methods for solving regularized risk minimization problems in machine learning and statistics. In this paper, we propose and analyze an accelerated variant of these methods in the mini-batch setting. This method incorporates two acceleration techniques: one is Nesterov’s acceleration method, and the othe...
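For reference, the deterministic (full-gradient) form of Nesterov-accelerated proximal gradient descent can be sketched for an l1-regularized least-squares problem. This is a FISTA-style illustration under assumed names and step sizes; the paper's method additionally uses mini-batch stochastic gradients and a second acceleration technique.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def accelerated_pgd(A, b, lam, step, n_iter=300):
    """Nesterov-accelerated proximal gradient descent for the lasso:
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    Deterministic full-gradient sketch, not the stochastic mini-batch
    variant analyzed in the paper."""
    x = np.zeros(A.shape[1])
    z = x.copy()
    t = 1.0
    for _ in range(n_iter):
        g = A.T @ (A @ z - b)                         # gradient of smooth part at z
        x_new = soft_threshold(z - step * g, step * lam)  # proximal step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov extrapolation
        x, t = x_new, t_new
    return x

# Example: with A = I the lasso solution is soft_threshold(b, lam)
A = np.eye(3)
b = np.array([3.0, 0.5, -2.0])
x_hat = accelerated_pgd(A, b, lam=1.0, step=1.0)
```

The extrapolation point z, rather than the iterate x, is where the gradient is evaluated; that single change is what yields the accelerated O(1/k^2) rate for the smooth part.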
A Gradient Descent Method for a Neural
It has been demonstrated that higher order recurrent neural networks exhibit an underlying fractal attractor as an artifact of their dynamics. These fractal attractors offer a very efficient mechanism to encode visual memories in a neural substrate, since even a simple twelve-weight network can encode a very large set of different images. The main problem in this memory model, which so far has r...
A Perry Descent Conjugate Gradient Method with Restricted Spectrum
A new nonlinear conjugate gradient method, based on Perry's idea, is presented. It is shown that its sufficient descent property is independent of any line search and that the eigenvalues of P_{k+1}^T P_{k+1} are bounded above, where P_{k+1} is the iteration matrix of the new method. Thus, global convergence is proven by spectral analysis for nonconvex functions when the line search fulfil...
The Momentum Term in Gradient Descent
A momentum term is usually included in the simulations of connectionist learning algorithms. Although it is well known that such a term greatly improves the speed of learning, there have been few rigorous studies of its mechanisms. In this paper, I show that in the limit of continuous time, the momentum parameter is analogous to the mass of Newtonian particles that move through a viscous medium...
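The classical momentum (heavy-ball) update that this abstract analyzes can be sketched as follows; the function name, learning rate, and momentum value here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gd_momentum(grad, x0, lr=0.1, momentum=0.9, n_iter=500):
    """Gradient descent with a classical momentum term:
        v_{k+1} = momentum * v_k - lr * grad(x_k)
        x_{k+1} = x_k + v_{k+1}
    In the continuous-time limit the momentum parameter plays the role of
    the mass of a particle moving through a viscous medium, as the
    abstract above describes."""
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(n_iter):
        v = momentum * v - lr * grad(x)  # velocity accumulates past gradients
        x = x + v                        # position moves along the velocity
    return x

# Example: minimize f(x) = 0.5 * (x - 3)^2, whose gradient is x - 3
x_min = gd_momentum(lambda x: x - 3.0, np.zeros(1))
```

With momentum near 1 the velocity averages many past gradients, which damps oscillation across narrow valleys while accelerating progress along consistent descent directions.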
Journal
Journal title: Mathematical Problems in Engineering
Year: 2021
ISSN: 1563-5147,1024-123X
DOI: 10.1155/2021/8867309